Efficient Hessian computation using sparse matrix derivatives in RAM notation.

Authors

  • Timo von Oertzen
  • Timothy R. Brick
Abstract

This article proposes a new, more efficient method to compute the minus two log likelihood, its gradient, and the Hessian for structural equation models (SEMs) in reticular action model (RAM) notation. The method exploits the beneficial property of RAM notation that the matrix derivatives used in RAM are sparse. For an SEM with K variables, P parameters, and P' entries of the symmetric or asymmetric matrix of the RAM notation filled with parameters, the asymptotic run time of the algorithm is O(P'K² + P²K² + K³). The naive implementation and numerical implementations are both O(P²K³), so that for typical applications of SEM, the proposed algorithm is asymptotically K times faster than the best previously known algorithm. A simulation comparison with a numerical algorithm shows that the asymptotic efficiency translates into an applied computational advantage that is crucial for maximum likelihood estimation, even in small, but especially in moderate or large, SEMs.
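To make the quantities in the abstract concrete, here is a minimal Python/NumPy sketch (an illustration, not the paper's algorithm; the matrix names F, A, S and all function names are assumptions) of the RAM-implied covariance Sigma = F(I - A)^-1 S (I - A)^-T F^T, the -2 log likelihood for a sample covariance, and the derivative of Sigma with respect to a single parameter located in A or S. The single-entry structure of dA/dtheta and dS/dtheta is the sparsity of the matrix derivatives that the proposed method exploits.

import numpy as np

def ram_implied_cov(F, A, S):
    # Model-implied covariance in RAM notation: Sigma = F (I - A)^-1 S (I - A)^-T F^T
    K = A.shape[0]
    B = np.linalg.inv(np.eye(K) - A)
    return F @ B @ S @ B.T @ F.T

def minus_two_log_likelihood(Sigma, sample_cov, n):
    # -2 log L for a mean-centered SEM fitted to a sample covariance (mean structure omitted)
    k = Sigma.shape[0]
    _, logdet = np.linalg.slogdet(Sigma)
    return n * (k * np.log(2 * np.pi) + logdet + np.trace(np.linalg.solve(Sigma, sample_cov)))

def d_sigma_single_param(F, A, S, i, j, in_A=True):
    # Derivative of Sigma w.r.t. one parameter sitting at entry (i, j) of A (or of the
    # symmetric matrix S). Because dA/dtheta is a single-entry matrix, dB/dtheta reduces
    # to an outer product of one column and one row of B = (I - A)^-1, which is exactly
    # the sparse structure the abstract refers to.
    K = A.shape[0]
    B = np.linalg.inv(np.eye(K) - A)
    if in_A:
        dB = np.outer(B[:, i], B[j, :])          # B E_ij B
        inner = dB @ S @ B.T
        return F @ (inner + inner.T) @ F.T       # F (dB S B^T + B S dB^T) F^T
    dS = np.zeros_like(S)                        # parameter in S: dS = E_ij (+ E_ji)
    dS[i, j] = 1.0
    if i != j:
        dS[j, i] = 1.0
    FB = F @ B
    return FB @ dS @ FB.T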


Similar articles

Sparse Jacobian Computation Using ADIC2 and ColPack

Many scientific applications benefit from the accurate and efficient computation of derivatives. Automatically generating these derivative computations from an application's source code offers a competitive alternative to other approaches, such as less accurate numerical approximations or labor-intensive analytical implementations. ADIC2 is a source transformation tool for generating code for co...


ADMAT: Automatic differentiation in MATLAB using object oriented methods

Differentiation is one of the fundamental problems in numerical mathematics. The solution of many optimization problems and other applications requires knowledge of the gradient, the Jacobian matrix, or the Hessian matrix of a given function. Automatic differentiation (AD) is an emerging, powerful technology for computing derivatives accurately and fast. ADMAT (Automatic Differentiation for M...
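As background for the snippet above, here is a minimal forward-mode automatic-differentiation sketch in Python (a generic illustration, not ADMAT's MATLAB interface; the Dual class and the dual_sin helper are assumptions): values and derivatives are propagated together, so derivatives come out exact rather than approximated by finite differences.

import math

class Dual:
    # Minimal forward-mode AD value: carries f(x) and f'(x) together.
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__

def dual_sin(x):
    # Chain rule for sin applied to a Dual number.
    return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

# Exact derivative of f(x) = x*sin(x) + 3x at x = 2: seed dx/dx = 1 and evaluate.
x = Dual(2.0, 1.0)
y = x * dual_sin(x) + 3 * x
print(y.value, y.deriv)   # f(2) and f'(2) = sin(2) + 2*cos(2) + 3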


Adifor Working Note #6: Structured Second- and Higher-order Derivatives through Univariate Taylor Series, MCS Preprint P296-0392

Second- and higher-order derivatives are required by applications in scientific computation, especially for optimization algorithms. The two complementary concepts of interpolating partial derivatives from univariate Taylor series and preaccumulating "local" derivatives form the mathematical foundations for accurate, efficient computation of second- and higher-order partial derivatives for large ...



Using an Efficient Penalty Method for Solving Linear Least Square Problem with Nonlinear Constraints

In this paper, we use a penalty method for solving the linear least squares problem with nonlinear constraints. In each iteration of penalty methods for solving this problem, the calculation of the projected Hessian matrix is required. Given that the objective function is linear least squares, the projected Hessian matrix of the penalty function consists of two parts, the exact amount of a part of i...
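As a generic illustration of the quadratic-penalty idea described above (a toy sketch with assumed problem data, not the cited paper's method; all names are hypothetical), a nonlinear equality constraint can be folded into a linear least squares objective with an increasing penalty weight:

import numpy as np
from scipy.optimize import minimize

# Toy data: minimize ||A x - b||^2 subject to x0^2 + x1^2 = 1 (nonlinear constraint).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 2))
b = rng.standard_normal(20)

def constraint(x):
    return x[0] ** 2 + x[1] ** 2 - 1.0

def penalized_objective(x, mu):
    # Least squares objective plus quadratic penalty on the constraint violation.
    r = A @ x - b
    return r @ r + mu * constraint(x) ** 2

x = np.zeros(2)
for mu in (1.0, 10.0, 100.0, 1000.0):   # tighten the penalty gradually
    x = minimize(penalized_objective, x, args=(mu,), method="BFGS").x
print(x, constraint(x))                  # constraint violation shrinks as mu grows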



Journal:
  • Behavior Research Methods

Volume 46, Issue 2

Pages: -

Publication date: 2014